
    Theta-point polymers in the plane and Schramm-Loewner evolution

    We study the connection between polymers at the theta temperature on the lattice and Schramm-Loewner chains with constant step length in the continuum. The latter realize a useful algorithm for the exact sampling of tricritical polymers, where finite-chain effects are excluded. The driving function computed from the lattice model via a radial implementation of the zipper method is shown to converge to Brownian motion of diffusivity kappa = 6 at large times. The distribution function of an internal portion of walk is well approximated by that obtained from Schramm-Loewner chains. The correlation-length exponent nu and the leading correction-to-scaling exponent Delta_1 measured in the continuum are compatible with nu = 4/7 (predicted for the theta point) and Delta_1 = 72/91 (predicted for percolation). Finally, we compute the shape factor and the asphericity of the chains, finding surprising accord with the theta-point end-to-end values. Comment: 8 pages, 6 figures
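
    The continuum object the lattice driving function is compared against is plain Brownian motion with diffusivity kappa = 6. A minimal sketch (our illustration, using numpy; all parameter names are ours) samples such driving functions and checks that the endpoint variance grows as kappa * t:

```python
import numpy as np

def sample_driving(kappa=6.0, t_max=1.0, n_steps=1000, n_samples=5000, seed=0):
    """Sample endpoints W(t_max) of Brownian driving functions with
    diffusivity kappa, the value expected for theta-point polymers."""
    rng = np.random.default_rng(seed)
    dt = t_max / n_steps
    # Independent Gaussian increments with Var = kappa * dt per step.
    increments = rng.normal(0.0, np.sqrt(kappa * dt), size=(n_samples, n_steps))
    return increments.sum(axis=1)  # W(t_max) for each sample

endpoints = sample_driving()
print(np.var(endpoints))  # should be close to kappa * t_max = 6
```

    The same Monte Carlo estimate, applied to the driving function extracted from lattice walks by the zipper method, is what underlies the convergence test described in the abstract.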

    A Parafermionic Generalization of the Jaynes Cummings Model

    We introduce a parafermionic version of the Jaynes-Cummings Hamiltonian by coupling $k$ Fock parafermions (nilpotent of order $F$) to a 1D harmonic oscillator, representing the interaction with a single mode of the electromagnetic field. We argue that for $k=1$ and $F\leq 3$ there is no difference between Fock parafermions and quantum spins $s=\frac{F-1}{2}$. We also derive a semiclassical approximation of the canonical partition function of the model by assuming $\hbar$ to be small, in the regime of a large enough total number of excitations $n$, where the dimension of the Hilbert space of the problem becomes constant as a function of $n$. We observe in this case an interesting behaviour of the average of the bosonic number operator, showing a single crossover between regimes with different integer values of this observable. These features persist when we generalize the parafermionic Hamiltonian by deforming the bosonic oscillator with a generic function $\Phi(x)$; the $q$-deformed bosonic oscillator corresponds to a specific choice of the deformation function $\Phi$. In this particular case, we observe at most $k(F-1)$ crossovers in the behavior of the mean bosonic number operator, suggesting a phenomenology of superradiance similar to the $k$-atom Jaynes-Cummings model. Comment: to appear in J. Phys.
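
    For orientation, the ordinary (spin-1/2) Jaynes-Cummings model that the paper generalizes can be written down in a few lines. The sketch below (our illustration, not the parafermionic Hamiltonian of the paper; parameter values are arbitrary) builds it in a truncated Fock basis and relies on the fact that the rotating-wave coupling conserves the total excitation number:

```python
import numpy as np

def jaynes_cummings(omega=1.0, omega0=1.0, g=0.1, n_max=10):
    """Standard spin-1/2 Jaynes-Cummings Hamiltonian in a truncated
    boson basis of n_max Fock states. Basis ordering: |n> (x) |s>,
    with the atomic basis ordered as [down, up]."""
    a = np.diag(np.sqrt(np.arange(1, n_max)), k=1)   # boson annihilation
    num = a.T @ a                                    # boson number operator
    sm = np.array([[0.0, 1.0], [0.0, 0.0]])          # atomic lowering sigma^-
    sz = np.diag([-0.5, 0.5])
    # H = omega a^dag a + omega0 s_z + g (a^dag sigma^- + a sigma^+)
    return (omega * np.kron(num, np.eye(2))
            + omega0 * np.kron(np.eye(n_max), sz)
            + g * (np.kron(a.T, sm) + np.kron(a, sm.T)))

H = jaynes_cummings()
```

    Because the coupling only exchanges one boson for one atomic excitation, H is block diagonal in the total number operator; the parafermionic version replaces the two-level lowering operator with $k$ nilpotent Fock-parafermion operators.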

    Soft bounds on diffusion produce skewed distributions and Gompertz growth

    Constraints can dramatically affect the behavior of diffusion processes. Recently, we analyzed a natural and a technological system and reported that they perform diffusion-like discrete steps displaying a peculiar constraint, whereby the increments of the diffusing variable are subject to configuration-dependent bounds. This work explores theoretically some of the revealing landmarks of such phenomenology, termed "soft bound". At long times, the system reaches a steady state irreversibly (i.e., violating detailed balance), characterized by a skewed "shoulder" in the density distribution and by a net local probability flux, which has entropic origin. The largest point in the support of the distribution follows a saturating dynamics, expressed by the Gompertz law, in line with empirical observations. Finally, we propose a generic allometric scaling for the origin of soft bounds. These findings shed light on the impact of such "scaling" constraints on a system and on their possible generating mechanisms. Comment: 9 pages, 6 color figures
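
    The Gompertz law invoked for the edge of the support is a simple double-exponential saturation. A minimal sketch (parameter names K, x0, a are illustrative, not taken from the paper):

```python
import numpy as np

def gompertz(t, K=1.0, x0=0.1, a=1.0):
    """Gompertz law x(t) = K * (x0/K)**exp(-a*t): starts at x0 and
    saturates toward the carrying value K, with the growth rate itself
    decaying exponentially in time."""
    return K * (x0 / K) ** np.exp(-a * t)

t = np.linspace(0.0, 10.0, 101)
x = gompertz(t)
```

    The characteristic signature, compared with logistic growth, is that saturation is approached as an exponential of an exponential, which is the saturating dynamics the abstract attributes to the largest point of the support.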

    Counting the learnable functions of structured data

    Cover's function counting theorem is a milestone in the theory of artificial neural networks. It provides an answer to the fundamental question of determining how many binary assignments (dichotomies) of $p$ points in $n$ dimensions can be linearly realized. Regrettably, it has proved hard to extend the same approach to more advanced problems than the classification of points. In particular, an emerging necessity is to find methods to deal with structured data, and specifically with non-pointlike patterns. A prominent case is that of invariant recognition, whereby identification of a stimulus is insensitive to irrelevant transformations on the inputs (such as rotations or changes in perspective in an image). An object is therefore represented by an extended perceptual manifold, consisting of inputs that are classified similarly. Here, we develop a function counting theory for structured data of this kind, by extending Cover's combinatorial technique, and we derive analytical expressions for the average number of dichotomies of generically correlated sets of patterns. As an application, we obtain a closed formula for the capacity of a binary classifier trained to distinguish general polytopes of any dimension. These results may help extend our theoretical understanding of generalization, feature extraction, and invariant object recognition by neural networks.
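
    The classical counting formula that the paper extends is easy to state and evaluate: for $p$ points in general position in $n$ dimensions, the number of linearly realizable dichotomies is $C(p,n) = 2\sum_{k=0}^{n-1}\binom{p-1}{k}$. A short check of its two landmark regimes:

```python
from math import comb

def cover_dichotomies(p, n):
    """Cover's count of dichotomies of p points in general position in
    n dimensions realizable by a linear separator through the origin:
    C(p, n) = 2 * sum_{k=0}^{n-1} binom(p-1, k)."""
    return 2 * sum(comb(p - 1, k) for k in range(n))

# Below capacity (p <= n): every one of the 2**p dichotomies is realizable.
print(cover_dichotomies(5, 10))   # 32 == 2**5
# At the storage capacity p = 2n: exactly half of all dichotomies survive.
print(cover_dichotomies(10, 5))   # 512 == 2**10 / 2
```

    The paper's contribution is the analogue of this count when the $p$ "points" are replaced by correlated structured patterns such as manifolds or polytopes.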

    Balancing building and maintenance costs in growing transport networks

    The costs associated with the length of links impose unavoidable constraints on the growth of natural and artificial transport networks. When future network developments cannot be predicted, building and maintenance costs require competing minimization mechanisms and cannot be optimized simultaneously. Here, we study the interplay of building and maintenance costs and its impact on the growth of transportation networks through a non-equilibrium model of network growth. We show that cost balance is a sufficient ingredient for the emergence of trade-offs between the network's total length and transport efficiency, of optimal strategies of construction, and of power-law temporal correlations in the growth history of the network. Analysis of empirical ant transport networks in the framework of this model suggests that different ant species may adopt similar optimization strategies. Comment: 4 pages main text, 2 pages references, 4 figures
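
    The kind of trade-off at stake can be illustrated with a toy sequential-growth rule (our illustration only, not the paper's model): each new random point attaches to the existing node that minimizes a weighted combination of the new edge length (a building/maintenance cost) and the parent's distance to the root (a transport-efficiency proxy), with lam interpolating between the two objectives:

```python
import numpy as np

def grow_network(n_nodes=50, lam=0.5, seed=1):
    """Toy cost-balanced growth of a tree in the unit square.
    Each new node i attaches to the existing node j minimizing
    lam * |x_i - x_j|  +  (1 - lam) * (root distance of j)."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n_nodes, 2))
    parent = [-1]        # node 0 is the root
    root_dist = [0.0]    # distance to the root along the tree
    total_length = 0.0
    for i in range(1, n_nodes):
        d = np.linalg.norm(pts[:i] - pts[i], axis=1)   # candidate edge lengths
        cost = lam * d + (1 - lam) * np.array(root_dist)
        j = int(np.argmin(cost))
        parent.append(j)
        root_dist.append(root_dist[j] + d[j])
        total_length += d[j]
    return parent, root_dist, total_length

parent, root_dist, total_length = grow_network()
```

    Sweeping lam from 0 to 1 trades total built length against average route length, the same tension between construction and efficiency that the abstract describes.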

    Beyond the storage capacity: data driven satisfiability transition

    Data structure has a dramatic impact on the properties of neural networks, yet its significance in the established theoretical frameworks is poorly understood. Here we compute the Vapnik-Chervonenkis entropy of a kernel machine operating on data grouped into equally labelled subsets. At variance with the unstructured scenario, the entropy is non-monotonic in the size of the training set, and it displays an additional critical point besides the storage capacity. Remarkably, the same behavior occurs in margin classifiers even with randomly labelled data, as is elucidated by identifying the synaptic volume encoding the transition. These findings reveal aspects of expressivity lying beyond the condensed description provided by the storage capacity, and they indicate the path towards more realistic bounds for the generalization error of neural networks. Comment: 5 pages, 2 figures

    Intermittent transport of bacterial chromosomal loci

    The short-time dynamics of bacterial chromosomal loci is a mixture of subdiffusive and active motion, in the form of rapid relocations with near-ballistic dynamics. While previous work has shown that such rapid motions are ubiquitous, we still have little grasp of their physical nature, and no positive model is available that describes them. Here, we propose a minimal theoretical model for loci movements as a fractional Brownian motion subject to a constant but intermittent driving force, and we compare simulations and analytical calculations to data from high-resolution dynamic tracking in E. coli. This analysis yields the characteristic time scales for intermittency. Finally, we discuss the possible shortcomings of this model, and we show that an increase in the effective local noise felt by the chromosome is associated with the active relocations. Comment: 8 pages, 6 figures; typos added, introduction expanded, conclusions unchanged
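
    The two ingredients of the model, anti-persistent fractional Brownian motion plus an intermittently applied constant force, can be sketched in a few lines (our illustration; all parameter values are arbitrary, not fitted to the E. coli data). Fractional Gaussian noise is generated exactly from a Cholesky factor of its covariance, and the drive is switched on only during periodic windows:

```python
import numpy as np

def fbm_with_intermittent_drive(n=200, hurst=0.35, v=1.0,
                                t_on=5, t_off=20, seed=2):
    """Fractional Brownian motion (Hurst exponent `hurst`) plus a
    constant drift v applied only during periodic 'on' windows,
    mimicking intermittent active forcing of a locus."""
    rng = np.random.default_rng(seed)
    k = np.arange(n)
    lag = np.abs(k[:, None] - k[None, :])
    # fGn autocovariance: 0.5 * (|l+1|^2H - 2|l|^2H + |l-1|^2H)
    cov = 0.5 * ((lag + 1.0) ** (2 * hurst) - 2 * lag ** (2 * hurst)
                 + np.abs(lag - 1.0) ** (2 * hurst))
    noise = np.linalg.cholesky(cov) @ rng.standard_normal(n)
    on = (k % (t_on + t_off)) < t_on     # intermittent driving windows
    return np.cumsum(noise + v * on)

x = fbm_with_intermittent_drive()
```

    With hurst < 0.5 the passive background is subdiffusive, while the "on" windows produce the near-ballistic relocations; fitting the window statistics is what yields the intermittency time scales mentioned in the abstract.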

    Generalization from correlated sets of patterns in the perceptron

    Generalization is a central aspect of learning theory. Here, we propose a framework that explores an auxiliary task-dependent notion of generalization, and we attempt to quantitatively answer the following question: given two sets of patterns with a given degree of dissimilarity, how easily will a network be able to "unify" their interpretation? This is quantified by the volume of the configurations of synaptic weights that classify the two sets in a similar manner. To show the applicability of our idea in a concrete setting, we compute this quantity for the perceptron, a simple binary classifier, using the classical statistical physics approach in the replica-symmetric ansatz. In this case, we obtain an analytical expression for the "distance-based capacity": the maximum load of patterns sustainable by the network at fixed dissimilarity between patterns and fixed allowed number of errors. This curve indicates that generalization is possible at any distance, but with decreasing capacity. We propose that a distance-based definition of generalization may be useful in numerical experiments with real-world neural networks, and to explore computationally sub-dominant sets of synaptic solutions.
    • 

    corecore